If $(Y_1,\dots,Y_n)'$ is a random vector with density $f_\theta$, where $\theta$ is an unknown parameter, and $y$ is the vector of observed values, then we define the likelihood function to be $L(\theta) = f_\theta(y)$, viewed as a function of $\theta$ with $y$ held fixed.
If $x_1,\dots,x_n$ are assumed to be observations of independent random variables, each with a normal distribution with mean $\mu$ and variance $\sigma^2$, then the joint density is
$$
f(x_1)\cdot f(x_2)\cdots f(x_n)
= \frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x_1-\mu)^2}{2\sigma^2}}\cdots\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x_n-\mu)^2}{2\sigma^2}}
= \prod_{i=1}^{n}\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x_i-\mu)^2}{2\sigma^2}}
= \frac{1}{(2\pi)^{n/2}\sigma^{n}}\,e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}
$$
and if we assume $\sigma^2$ is known, then the likelihood function is
$$
L(\mu) = \frac{1}{(2\pi)^{n/2}\sigma^{n}}\,e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}.
$$
Maximizing this is done by maximizing the log-likelihood, i.e. finding the $\mu$ which maximizes
$$
\ell(\mu) = \log L(\mu) = -\frac{n}{2}\log(2\pi) - n\log\sigma - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2.
$$
Since the first two terms do not depend on $\mu$, this amounts to minimizing $\sum_{i=1}^{n}(x_i-\mu)^2$; setting $\ell'(\mu) = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0$ gives $\hat{\mu} = \bar{x}$.
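The claim that the log-likelihood is maximized at the sample mean can be checked numerically. The following R sketch (with illustrative values for $\sigma$, the sample size, and the seed, all chosen here for the example) maximizes $\ell(\mu)$ with `optimize()` and compares the result to $\bar{x}$:

```r
# Numerical check: with sigma known, the log-likelihood
#   l(mu) = -(n/2)*log(2*pi) - n*log(sigma) - sum((x - mu)^2)/(2*sigma^2)
# should be maximized at the sample mean.
set.seed(1)
sigma <- 2                                  # assumed known
x <- rnorm(25, mean = 5, sd = sigma)        # simulated data
loglik <- function(mu) {
  n <- length(x)
  -n/2 * log(2*pi) - n * log(sigma) - sum((x - mu)^2) / (2 * sigma^2)
}
muhat <- optimize(loglik, interval = range(x), maximum = TRUE)$maximum
c(muhat = muhat, xbar = mean(x))            # agree up to optimizer tolerance
```

Because $\ell(\mu)$ is a concave quadratic in $\mu$, the interior maximum found by `optimize()` coincides with $\bar{x}$.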
Recall that $X_1,\dots,X_n$ are random variables (reflecting the population distribution) and $x_1,\dots,x_n$ are the numerical outcomes of these random variables.
We use upper case letters to denote random variables and lower case letters to denote outcome or data.
Let the mean of a population be zero and the standard deviation $\sigma = 4$.
Then draw samples from this population of size $n$, with $n$ equal to 4, 16, or 64.
The sample mean $\bar{X}$ will have a distribution with mean zero and standard deviation $\sigma/\sqrt{n}$, where $n = 4$, $16$, or $64$.
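The $\sigma/\sqrt{n}$ behaviour is easy to verify by simulation. A minimal R sketch (the number of replications, 10000, is an arbitrary choice for the example):

```r
# Simulate sample means from a population with mean 0 and sigma = 4,
# and compare their standard deviation to the theoretical sigma/sqrt(n).
sigma <- 4
for (n in c(4, 16, 64)) {
  xbars <- replicate(10000, mean(rnorm(n, mean = 0, sd = sigma)))
  cat("n =", n,
      " simulated sd:", round(sd(xbars), 3),
      " theory sigma/sqrt(n):", sigma / sqrt(n), "\n")
}
```

The simulated standard deviations should be close to $2$, $1$, and $0.5$ for $n = 4$, $16$, and $64$ respectively.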
The following R commands can be used to generate the simulated distribution of the least-squares slope estimator $\hat{\beta}$:
```r
library(MASS)                  # for truehist()
nsim <- 1000
betahat <- NULL
for (i in 1:nsim) {
  n <- 20
  x <- 1:n                     # fixed x vector
  y <- 2 + 0.4 * x + rnorm(n, 0, 1)
  xbar <- mean(x)
  ybar <- mean(y)
  b <- sum((x - xbar) * (y - ybar)) / sum((x - xbar)^2)  # slope estimate
  a <- ybar - b * xbar                                   # intercept estimate
  betahat <- c(betahat, b)
}
truehist(betahat)
```